
    The Use of Computational Methods in the Grouping and Assessment of Chemicals - Preliminary Investigations

    This document presents a perspective on how computational approaches could potentially be used in the grouping and assessment of chemicals, and especially in the application of read-across and the development of chemical categories. The perspective is based on experience gained by the authors during 2006 and 2007, when the Joint Research Centre's European Chemicals Bureau was directly involved in the drafting of technical guidance on the applicability of computational methods under REACH. Some of the experience gained and ideas developed resulted from a number of research-based case studies conducted in-house during 2006 and the first half of 2007. The case studies were performed to explore the possible applications of computational methods in the assessment of chemicals and to contribute to the development of technical guidance. Not all of the methods explored and ideas developed are explicitly included in the final guidance documentation for REACH. Many of the methods are novel, and are still being refined and assessed by the scientific community. At present, many of the methods have not been tried and tested in the regulatory context. The authors therefore hope that the perspective and case studies compiled in this document, whilst not intended to serve as guidance, will nevertheless provide input to further research efforts aimed at developing computational methods, and at exploring their potential applicability in the regulatory assessment of chemicals.

    Early peripheral clearance of leukemia-associated immunophenotypes in AML: centralized analysis of a randomized trial

    Although genetics is a relevant risk factor in acute myeloid leukemia (AML), it can be minimally informative and/or not readily available for the early identification of patients at risk for treatment failure. In a randomized trial comparing standard vs high-dose induction (ClinicalTrials.gov NCT00495287), we studied early peripheral blast cell clearance (PBC) as a rapid predictive assay of chemotherapy response to determine whether it correlates with the achievement of complete remission (CR), as well as postremission outcome, according to induction intensity. Individual leukemia-associated immunophenotypes (LAIPs) identified pretherapy by flow cytometry were validated and quantified centrally after 3 days of treatment, expressing PBC on a logarithmic scale as the ratio of absolute LAIP+ cells on day 1 and day 4. Of 178 patients, 151 (84.8%) were evaluable. Patients in CR exhibited significantly higher median PBC (2.3 log) compared with chemoresistant patients (1.0 log; P < .0001). PBC < 1.0 predicted the worst outcome (CR, 28%). With 1.5 log established as the most accurate cutoff predicting CR, 87.5% of patients with PBC > 1.5 (PBChigh, n = 96) and 43.6% of patients with PBC ≤ 1.5 (PBClow, n = 55) achieved CR after single-course induction (P < .0001). CR and PBChigh rates were increased in patients randomized to the high-dose induction arm (P = .04) and correlated strongly with genetic/cytogenetic risk. In multivariate analysis, PBC retained significant predictive power for CR, relapse risk, and survival. Thus, PBC analysis can provide a very early prediction of outcome, correlates with treatment intensity and disease subset, and may support studies of customized AML therapy.
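    The PBC measure described above is a simple computable quantity: the log10 reduction in circulating LAIP+ blasts between day 1 and day 4, dichotomized at the 1.5-log cutoff. A minimal sketch, assuming illustrative cell counts (the function names and example values are not from the trial data):

    ```python
    import math

    CUTOFF = 1.5  # log reduction separating PBC-high from PBC-low (from the abstract)

    def peripheral_blast_clearance(laip_day1: float, laip_day4: float) -> float:
        """Log10 reduction in absolute LAIP+ blast counts after 3 days of therapy."""
        if laip_day1 <= 0 or laip_day4 <= 0:
            raise ValueError("cell counts must be positive")
        return math.log10(laip_day1 / laip_day4)

    def classify(pbc: float, cutoff: float = CUTOFF) -> str:
        """Dichotomize PBC at the reported most accurate cutoff."""
        return "PBC-high" if pbc > cutoff else "PBC-low"

    pbc = peripheral_blast_clearance(5000.0, 50.0)  # a 2-log clearance
    print(round(pbc, 2), classify(pbc))  # 2.0 PBC-high
    ```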

    Chemical Similarity and Threshold of Toxicological Concern (TTC) Approaches: Report of an ECB Workshop held in Ispra, November 2005

    There are many national, regional and international programmes – either regulatory or voluntary – to assess the hazards or risks of chemical substances to humans and the environment. The first step in making a hazard assessment of a chemical is to ensure that there is adequate information on each of the endpoints. If adequate information is not available, additional data are needed to complete the dataset for the substance. For reasons of resources and animal welfare, it is important to limit the number of tests that have to be conducted, where this is scientifically justifiable. One approach is to consider closely related chemicals as a group, or chemical category, rather than as individual chemicals. In a category approach, data for chemicals and endpoints that have already been tested are used to estimate the hazard for untested chemicals and endpoints. Categories of chemicals are selected on the basis of similarities in biological activity that are associated with a common underlying mechanism of action. A homologous series of chemicals exhibiting a coherent trend in biological activity can be rationalised on the basis of a constant change in structure. This type of grouping is relatively straightforward. The challenge lies in identifying the relevant structural and physicochemical characteristics that enable more sophisticated groupings to be made on the basis of similarity in biological activity, and hence purported mechanism of action. Linking two chemicals together and rationalising their similarity with reference to one or more endpoints has largely been carried out on an ad hoc basis. Even with larger groups, the process remains ad hoc and based on expert judgement. There is still very little guidance on tools and approaches for grouping chemicals systematically.
In November 2005, the ECB Workshop on Chemical Similarity and Thresholds of Toxicological Concern (TTC) Approaches was convened to identify the approaches currently available for encoding similarity, and how these can be used to facilitate the grouping of chemicals. This report aims to capture the main themes that were discussed. In particular, it outlines a number of different approaches that can facilitate the formation of chemical groupings, in terms of the context under consideration and the information likely to be required. Grouping methods were divided into four classes – knowledge-based, analogue-based, unsupervised, and supervised. A flowchart was constructed to capture a possible workflow, highlighting where and how these approaches might best be applied.
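The analogue-based class of grouping methods mentioned above typically encodes each chemical as a set of structural features and groups chemicals whose similarity exceeds a threshold. A minimal sketch using set-based Tanimoto similarity; the fragment sets and the 0.7 threshold are purely illustrative, not from the workshop report:

```python
# Analogue-based grouping sketch: chemicals as sets of structural fragments,
# compared with the Tanimoto coefficient |A ∩ B| / |A ∪ B|.
def tanimoto(a: set, b: set) -> float:
    union = a | b
    return len(a & b) / len(union) if union else 0.0

# Hypothetical fragment sets for three simple chemicals.
fragments = {
    "ethanol":  {"C-C", "C-O", "O-H"},
    "propanol": {"C-C", "C-O", "O-H", "C-C-C"},
    "ethane":   {"C-C"},
}

def analogues(target: str, threshold: float = 0.7) -> list:
    """Chemicals whose Tanimoto similarity to the target meets the threshold."""
    t = fragments[target]
    return [name for name, f in fragments.items()
            if name != target and tanimoto(t, f) >= threshold]

print(analogues("ethanol"))  # ['propanol']
```

In practice, cheminformatics toolkits compute the same coefficient over binary fingerprints rather than hand-written fragment sets, but the grouping logic is the same.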

    The application of molecular modelling in the safety assessment of chemicals: A case study on ligand-dependent PPARγ dysregulation.

    The aim of this paper was to provide a proof of concept demonstrating that molecular modelling methodologies can be employed as a part of an integrated strategy to support toxicity prediction consistent with the mode of action/adverse outcome pathway (MoA/AOP) framework. To illustrate the role of molecular modelling in predictive toxicology, a case study was undertaken in which molecular modelling methodologies were employed to predict the activation of the peroxisome proliferator-activated nuclear receptor γ (PPARγ) as a potential molecular initiating event (MIE) for liver steatosis. A stepwise procedure combining different in silico approaches (virtual screening based on docking and pharmacophore filtering, and molecular field analysis) was developed to screen for PPARγ full agonists and to predict their transactivation activity (EC50). The performance metrics of the classification model to predict PPARγ full agonists were balanced accuracy = 81%, sensitivity = 85% and specificity = 76%. The 3D QSAR model developed to predict the EC50 of PPARγ full agonists had the following statistical parameters: q²cv = 0.610, Nopt = 7, SEPcv = 0.505, r²pr = 0.552. To support the linkage of PPARγ agonism predictions to prosteatotic potential, molecular modelling was combined with independently performed mechanistic mining of available in vivo toxicity data, followed by ToxPrint chemotypes analysis. The approaches investigated demonstrated a potential to predict the MIE, to facilitate the process of MoA/AOP elaboration, to increase the scientific confidence in AOP, and to become a basis for 3D chemotype development.
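    The three classification metrics reported above are related: balanced accuracy is the mean of sensitivity (true-positive rate) and specificity (true-negative rate), which is why 85% and 76% round to the reported 81%. A minimal sketch; only the three percentages come from the abstract, the function names are illustrative:

    ```python
    # How the reported classifier metrics fit together.
    def sensitivity(tp: int, fn: int) -> float:
        """True-positive rate: correctly identified actives."""
        return tp / (tp + fn)

    def specificity(tn: int, fp: int) -> float:
        """True-negative rate: correctly identified inactives."""
        return tn / (tn + fp)

    def balanced_accuracy(sens: float, spec: float) -> float:
        """Mean of sensitivity and specificity; robust to class imbalance."""
        return (sens + spec) / 2

    sens, spec = 0.85, 0.76  # reported for the PPARγ full-agonist classifier
    print(f"balanced accuracy = {balanced_accuracy(sens, spec):.1%}")  # 80.5%, reported rounded to 81%
    ```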

    Theoretical studies of mononuclear non-heme iron active sites

    The quantum chemical investigations presented in this thesis use hybrid density functional theory to shed light on the catalytic mechanisms of mononuclear non-heme iron oxygenases, which accommodate a ferrous ion in their active sites. More specifically, the dioxygen activation process and the subsequent oxidative reactions were studied in the following enzymes: tetrahydrobiopterin-dependent hydroxylases, naphthalene 1,2-dioxygenase and α-ketoglutarate-dependent enzymes. In light of the many experimental efforts devoted to functional mimics of non-heme iron oxygenases, the reactivity of functional analogues was also examined. The computed energetics and the available experimental data served to assess the feasibility of the reaction mechanisms investigated. Dioxygen activation in tetrahydrobiopterin- and α-ketoglutarate-dependent enzymes was found to involve a high-valent iron-oxo species, which was then capable of substrate hydroxylation. In the case of naphthalene 1,2-dioxygenase, the reactivity of an iron(III)-hydroperoxo species toward the substrate was investigated and compared to that of its biomimetic counterpart.

    Computational Tools for Regulatory Needs

    In the regulatory framework there is a growing need for in silico methods that can be used to gain information about the environmental fate, and the ecological and health effects, of chemicals. Computer-aided toxicity prediction mainly makes use of the relationship between chemical structure and biological activity to compute the (eco)toxicity and fate of chemicals (e.g., physicochemical properties, toxicological activity, distribution, and fate), thus generating non-testing data regarding the effects of the chemicals on man and the environment. The different techniques used to derive non-testing information include (Quantitative) Structure-Activity Relationship models, expert systems, and read-across/category approaches. These “non-testing methods” rely on the idea that the biological activity of a chemical depends on its intrinsic nature and can be directly inferred from its molecular structure and the properties of similar compounds whose activities are known. In principle, non-testing methods can be applied at different stages in the development and registration of chemicals, from in-house research and development to the compilation of dossiers on chemical safety for submission to regulatory authorities. In practice, the ways in which these approaches are used depend on the requirements of the specific legislation and the possibilities offered by regulatory authorities. This chapter focuses on the use of non-testing methods in the regulatory assessment of chemicals. A brief explanation is provided of the main types of non-testing methods, and reference is made to the use of these methods in the European Union (EU), as foreseen by the REACH legislation.
The uptake of non-testing methods in the EU is an example of a trend across many countries within the Organisation for Economic Cooperation and Development (OECD). The need to use non-testing methods has led to the development and implementation of Integrated Testing Strategies (ITS), based as far as possible on the use of non-testing data [2]. The use of non-testing methods within such strategies implies the need for computational tools to facilitate the entire workflow. This chapter presents the views of the authors on the main functionalities that should be incorporated into a Decision Support System (DSS) to facilitate the implementation of this workflow. Such a DSS is intended to be useful in any regulatory context.

    The Integrated Use of Models for the Properties and Effects of Chemicals by means of a Structured Workflow

    This paper reviews the applicability of different types of non-testing methods and in silico tools in the framework of a structured workflow that aids their exploitation for the prediction of properties that contribute to hazard and risk assessments of chemicals. These properties include basic physicochemical properties, metabolic and environmental fate, and ecological and health effects of chemicals. The workflow comprises a structured sequence of operations that integrates the functionalities of a wide array of in silico tools. The workflow could be used for in-house decision making (e.g. screening the properties of potential drugs and commercial chemicals) as well as for generating data required in regulatory submissions. The general workflow presented here is intended to be broadly applicable to all endpoints and different regulatory frameworks, including the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) legislation in the European Union. The general framework can be adapted to meet the needs of specific chemicals, endpoints and regulatory purposes. This review is one of a series of mini-reviews in this journal.

    Role of in silico genotoxicity tools in the regulatory assessment of pharmaceutical impurities

    The toxicological assessment of genotoxic impurities is an important consideration in the regulatory framework for pharmaceuticals. In this context, the application of promising computational methods (e.g. Quantitative Structure-Activity Relationships (QSARs), Structure-Activity Relationships (SARs) and/or expert systems) for the evaluation of genotoxicity is needed, especially when very limited information on impurities is available, both for practical reasons and to respect the principle of the 3Rs (Replacement, Reduction and Refinement) of animal use. To gain an overview of how computational methods are used internationally in the regulatory assessment of pharmaceutical impurities, the current regulatory documents were reviewed. The software recommended in the guidelines (e.g. MCASE, MC4PC, Derek for Windows) or practically used by various regulatory agencies (e.g. the U.S. Food and Drug Administration, and the U.S. and Danish Environmental Protection Agencies), as well as the other existing programs, were analysed, highlighting their benefits and limitations. Both statistically-based and knowledge-based (expert system) tools were analysed. Information on the models’ training sets, as well as their applicability domains, was retrieved. The overall conclusions on the available in silico tools for genotoxicity and carcinogenicity prediction are quite optimistic, and the regulatory application of QSAR methods is constantly growing. For regulatory purposes, it is recommended that predictions of genotoxicity/carcinogenicity be based on a battery of models, combining high-sensitivity models (low rate of false negatives) with high-specificity ones (low rate of false positives), and in vitro assays in an integrated manner.
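    The battery recommendation above can be made concrete with a simple combination rule. The rule sketched here (screen with the high-sensitivity model, confirm with the high-specificity one, flag disagreements for in vitro follow-up) is one common choice, not the one prescribed by any specific guideline:

    ```python
    # Hedged sketch of a two-model genotoxicity battery. Inputs are the binary
    # calls of a high-sensitivity model (few false negatives) and a
    # high-specificity model (few false positives).
    def battery_call(sensitive_pred: bool, specific_pred: bool) -> str:
        if sensitive_pred and specific_pred:
            return "positive"    # both models agree: likely genotoxic
        if not sensitive_pred:
            return "negative"    # the sensitive screen is clear: low FN risk
        return "equivocal"       # disagreement: resolve with in vitro assays

    print(battery_call(True, True))    # positive
    print(battery_call(False, False))  # negative
    print(battery_call(True, False))   # equivocal
    ```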